    Propositional computability logic I

    In the same sense as classical logic is a formal theory of truth, the recently initiated approach called computability logic is a formal theory of computability. It understands (interactive) computational problems as games played by a machine against the environment, their computability as the existence of a machine that always wins the game, logical operators as operations on computational problems, and validity of a logical formula as its being a scheme of "always computable" problems. The present contribution gives a detailed exposition of a soundness and completeness proof for an axiomatization of one of the most basic fragments of computability logic. The logical vocabulary of this fragment contains operators for the so-called parallel and choice operations, and its atoms represent elementary problems, i.e. predicates in the standard sense. This article is self-contained, as it explains all relevant concepts. While not technically necessary, familiarity with the foundational paper "Introduction to computability logic" [Annals of Pure and Applied Logic 123 (2003), pp. 1-99] would greatly help the reader in understanding the philosophy, underlying motivations, potential and utility of computability logic, the context that determines the value of the present results. An online introduction to the subject is available at http://www.cis.upenn.edu/~giorgi/cl.html and http://www.csc.villanova.edu/~japaridz/CL/gsoll.html. Comment: To appear in ACM Transactions on Computational Logic.
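
    As a rough illustration of the two operator families mentioned above, the display below contrasts a choice conjunction with a parallel conjunction; the reading of the operators follows the cited foundational paper, while the concrete game example is our own.

```latex
% Choice conjunction \sqcap: the environment selects one conjunct, and the
% machine must then win the selected game.
% Parallel conjunction \wedge: both games are played simultaneously, and the
% machine must win both of them.
\[
  \textit{Chess} \sqcap \textit{Checkers}
  \qquad\text{vs.}\qquad
  \textit{Chess} \wedge \textit{Checkers}
\]
```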

    Multiple Particle Interference and Quantum Error Correction

    The concept of multiple particle interference is discussed, using insights provided by the classical theory of error correcting codes. This leads to a discussion of error correction in a quantum communication channel or a quantum computer. Methods of error correction in the quantum regime are presented, and their limitations assessed. A quantum channel can recover from arbitrary decoherence of x qubits if K bits of quantum information are encoded using n quantum bits, where K/n can be greater than 1 - 2H(2x/n), but must be less than 1 - 2H(x/n). This implies exponential reduction of decoherence with only a polynomial increase in the computing resources required. Therefore quantum computation can be made free of errors in the presence of physically realistic levels of decoherence. The methods also allow isolation of quantum communication from noise and eavesdropping (quantum privacy amplification). Comment: Submitted to Proc. Roy. Soc. Lond. A in November 1995, accepted May 1996. 39 pages, 6 figures. This is now the final version. The changes are some added references, a changed final figure, and a more precise use of the word 'decoherence'. I would like to propose the word 'defection' for a general unknown error of a single qubit (rotation and/or entanglement). It is useful because it captures the nature of the error process, and has a verb form 'to defect'. Random unitary changes (rotations) of a qubit are caused by defects in the quantum computer; to entangle randomly with the environment is to form a treacherous alliance with an enemy of successful quantum computation.
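
    A small numerical sketch of the rate bounds quoted in the abstract: it assumes H is the binary entropy function and simply evaluates 1 - 2H(2x/n) and 1 - 2H(x/n) for an illustrative choice of n and x (the specific numbers are ours, not the paper's).

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def rate_bounds(n, x):
    """Bounds on K/n quoted in the abstract for recovering from arbitrary
    decoherence of x qubits out of n: K/n can exceed 1 - 2H(2x/n) but must
    stay below 1 - 2H(x/n)."""
    achievable = 1.0 - 2.0 * binary_entropy(2.0 * x / n)
    necessary = 1.0 - 2.0 * binary_entropy(x / n)
    return achievable, necessary

# Illustrative values only: 1000 physical qubits, at most 10 of them decohered.
low, high = rate_bounds(n=1000, x=10)
print(f"rates above ~{low:.3f} are achievable; rates must stay below ~{high:.3f}")
```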

    Turing's three philosophical lessons and the philosophy of information

    In this article, I outline the three main philosophical lessons that we may learn from Turing's work, and how they lead to a new philosophy of information. After a brief introduction, I discuss his work on the method of levels of abstraction (LoA), and his insistence that questions could be meaningfully asked only by specifying the correct LoA. I then look at his second lesson, about the sort of philosophical questions that seem to be most pressing today. Finally, I focus on the third lesson, concerning the new philosophical anthropology that owes so much to Turing's work. I then show how the lessons are learned by the philosophy of information. In the conclusion, I draw a general synthesis of the points made, in view of the development of the philosophy of information itself as a continuation of Turing's work. This journal is © 2012 The Royal Society. Peer reviewed.

    Diffusion-induced spontaneous pattern formation on gelation surfaces

    Although pattern formation on polymer gels has been considered a result of the mechanical instability due to the volume phase transition, we found a macroscopic surface pattern formation that is not caused by mechanical instability. It develops on gelation surfaces, and we consider that reaction-diffusion dynamics mainly induce a surface instability during polymerization. Random and straight stripe patterns were observed, depending on gelation conditions. We found a scaling relation between the characteristic wavelength and the gelation time. This scaling is consistent with reaction-diffusion dynamics and would be a first step toward revealing the dynamics of gelation pattern formation. Comment: 7 pages, 4 figures.

    An evolutionary model with Turing machines

    The development of a large non-coding fraction in eukaryotic DNA and the phenomenon of code bloat in the field of evolutionary computation show a striking similarity. This seems to suggest that, in the presence of mechanisms of code growth, the evolution of a complex code cannot be attained without maintaining a large inactive fraction. To test this hypothesis we performed computer simulations of an evolutionary toy model for Turing machines, studying the relations among fitness and the coding/non-coding ratio while varying mutation and code growth rates. The results suggest that, in our model, having a large reservoir of non-coding states constitutes a great (long-term) evolutionary advantage. Comment: 16 pages, 7 figures.
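
    A schematic sketch of the kind of evolutionary loop described above: genomes are Turing-machine transition tables, mutation and state duplication (code growth) act with tunable rates, and fitness is obtained by running each machine for a bounded number of steps. The representation, fitness function and parameter values are our own simplifications, not the model used in the paper.

```python
import random

# A genome is a list of states; each state holds one transition per read
# symbol (0 or 1): (symbol_to_write, head_move, next_state).  States that a
# run never visits play the role of the "non-coding" fraction.

def random_state(n_states):
    return [(random.randint(0, 1), random.choice((-1, 1)),
             random.randrange(n_states)) for _ in range(2)]

def fitness(genome, tape_len=64, max_steps=500):
    """Toy fitness: number of 1s left on an initially blank circular tape."""
    tape, head, state = [0] * tape_len, tape_len // 2, 0
    for _ in range(max_steps):
        write, move, nxt = genome[state][tape[head]]
        tape[head] = write
        head = (head + move) % tape_len
        state = nxt
    return sum(tape)

def mutate(genome, mut_rate=0.05, growth_rate=0.02):
    """Point mutations plus occasional duplication of a state (code growth)."""
    child = [list(state) for state in genome]
    for state in child:
        for sym in (0, 1):
            if random.random() < mut_rate:
                state[sym] = (random.randint(0, 1), random.choice((-1, 1)),
                              random.randrange(len(child)))
    if random.random() < growth_rate:
        child.append(list(random.choice(child)))  # duplicated states start out non-coding
    return child

def evolve(pop_size=50, n_states=4, generations=200):
    pop = [[random_state(n_states) for _ in range(n_states)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)
```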

    Learning, Social Intelligence and the Turing Test - why an "out-of-the-box" Turing Machine will not pass the Turing Test

    The Turing Test (TT) checks for human intelligence, rather than any putative general intelligence. It involves repeated interaction requiring learning in the form of adaptation to the human conversation partner. It is a macro-level post-hoc test, in contrast to the definition of a Turing Machine (TM), which is an a priori micro-level definition. This raises the question of whether learning is just another computational process, i.e. whether it can be implemented as a TM. Here we argue that learning or adaptation is fundamentally different from computation, though it does involve processes that can be seen as computations. To illustrate this difference we compare (a) designing a TM and (b) learning a TM, defining them for the purpose of the argument. We show that there is a well-defined sequence of problems which are not effectively designable but are learnable, in the form of the bounded halting problem. Some characteristics of human intelligence are reviewed, including its interactive nature, learning abilities, imitative tendencies, linguistic ability and context-dependency. A story that explains some of these is the Social Intelligence Hypothesis. If this is broadly correct, it points to the necessity of a considerable period of acculturation (social learning in context) if an artificial intelligence is to pass the TT. Whilst it is always possible to 'compile' the results of learning into a TM, this would not be a designed TM and would not be able to continually adapt (and so pass future TTs). We conclude three things, namely that: a purely "designed" TM will never pass the TT; that there is no such thing as a general intelligence, since it necessarily involves learning; and that learning/adaptation and computation should be clearly distinguished. Comment: 10 pages, invited talk at Turing Centenary Conference CiE 2012, special session on "The Turing Test and Thinking Machines".
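
    The bounded halting problem invoked above is decidable by brute-force simulation, which is what makes it a natural test case for the designable/learnable distinction. A minimal sketch, using a toy machine encoding of our own choosing:

```python
def halts_within(delta, input_tape, step_bound):
    """Decide the bounded halting problem by direct simulation.

    delta maps (state, read_symbol) -> (write_symbol, head_move, next_state);
    a missing entry means the machine halts.  This toy machine encoding is
    ours, chosen only to make the point concrete.
    """
    tape = dict(enumerate(input_tape))    # sparse tape, blank symbol = 0
    state, head = "start", 0
    for _ in range(step_bound):
        key = (state, tape.get(head, 0))
        if key not in delta:              # no applicable rule: halt
            return True
        write, move, state = delta[key]
        tape[head] = write
        head += move
    return False                          # did not halt within the bound

# A one-rule machine that marches right forever never halts within any
# finite bound; an empty rule table halts immediately.
runner = {("start", 0): (0, +1, "start")}
assert halts_within({}, [0], 10) is True
assert halts_within(runner, [0], 10) is False
```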

    Pattern Formation Induced by Time-Dependent Advection

    We study pattern-forming instabilities in reaction-advection-diffusion systems. We develop an approach based on Lyapunov-Bloch exponents to assess the impact of a spatially periodic mixing flow on the stability of a spatially homogeneous state. We deal with flows that are periodic in space but may have arbitrary time dependence. We propose a discrete-in-time model, where reaction, advection, and diffusion act as successive operators, and show that a mixing advection can lead to a pattern-forming instability in a two-component system where only one of the species is advected. Physically, this can be explained as crossing the threshold of the Turing instability due to an effective increase of one of the diffusion constants.
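
    A minimal sketch of the discrete-in-time construction described above, in which reaction, advection and diffusion act as successive operators and only one species is advected. The 1D grid, the specific reaction terms and the cyclic-shift "mixing flow" are illustrative assumptions of ours, not the authors' model.

```python
import numpy as np

def step(u, v, dt=0.05, dx=1.0, Du=0.05, Dv=1.0, shift=3):
    """One discrete-time step: reaction, then advection, then diffusion."""
    # 1. Reaction: a generic activator-inhibitor pair (illustrative only).
    u = u + dt * (u - u**3 - v)
    v = v + dt * 0.5 * (u - v)
    # 2. Advection acts on one species only (a cyclic shift as a toy mixing flow).
    u = np.roll(u, shift)
    # 3. Diffusion via an explicit finite-difference Laplacian on a periodic grid.
    lap = lambda f: (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2
    return u + dt * Du * lap(u), v + dt * Dv * lap(v)

# Start from a homogeneous state plus a tiny perturbation and iterate;
# growth of the perturbation signals a pattern-forming instability.
rng = np.random.default_rng(0)
u, v = 1e-3 * rng.standard_normal(256), np.zeros(256)
for _ in range(2000):
    u, v = step(u, v)
```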

    Two-Bit Messages are Sufficient to Implement Atomic Read/Write Registers in Crash-prone Systems

    Atomic registers are certainly the most basic objects of computing science. Their implementation on top of an n-process asynchronous message-passing system has received a lot of attention. It has been shown that t < n/2 (where t is the maximal number of processes that may crash) is a necessary and sufficient requirement to build an atomic register on top of a crash-prone asynchronous message-passing system. Considering such a context, this paper presents an algorithm which implements a single-writer multi-reader atomic register with only four message types, where no message needs to carry control information in addition to its type. Hence, two bits are sufficient to capture all the control information carried by the implementation messages. Moreover, the messages of two types need to carry a data value, while the messages of the two other types carry no value at all. As far as we know, this algorithm is the first with such an optimality property on the size of the control information carried by messages. It is also particularly efficient from a time complexity point of view.
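
    A tiny sketch of why two bits suffice: four message types can be indexed by two bits, with two of the types carrying a data value and two carrying none. The type names and field layout below are placeholders of ours; the paper's actual protocol logic is not reproduced here.

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional

class MsgType(IntEnum):
    """Four message types fit in two bits; these names are placeholders,
    not the identifiers used in the paper."""
    TYPE_A = 0b00   # carries a data value
    TYPE_B = 0b01   # carries a data value
    TYPE_C = 0b10   # carries no value
    TYPE_D = 0b11   # carries no value

@dataclass
class Message:
    kind: MsgType
    value: Optional[int] = None   # present only for the two value-carrying types

# Example messages: a value-carrying message and a value-free one.
m1 = Message(MsgType.TYPE_A, value=42)
m2 = Message(MsgType.TYPE_C)
assert all(t.bit_length() <= 2 for t in MsgType)   # all control info fits in two bits
```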

    Differentiation and Replication of Spots in a Reaction Diffusion System with Many Chemicals

    The replication and differentiation of spots in reaction-diffusion equations are studied by extending the Gray-Scott model with self-replicating spots to include the many degrees of freedom needed to model systems with many chemicals. By examining many possible reaction networks, the behavior of this model is categorized into three types: replication of homogeneous fixed spots, replication of oscillatory spots, and differentiation from 'multipotent spots'. These multipotent spots either replicate or differentiate into other types of spots with different fixed-point dynamics, and as a result an inhomogeneous pattern of spots is formed. This differentiation process of spots is analyzed in terms of the loss of chemical diversity and the decrease of the local Kolmogorov-Sinai entropy. The relevance of the results to developmental cell biology and stem cells is also discussed. Comment: 8 pages, 12 figures, Submitted to EP
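
    For reference, a minimal sketch of the standard two-species Gray-Scott model that the paper extends; the update rule is the usual explicit finite-difference scheme, and the parameter values are illustrative choices of ours rather than the ones used in the paper.

```python
import numpy as np

def gray_scott_step(u, v, Du=0.16, Dv=0.08, F=0.035, k=0.060, dt=1.0):
    """One explicit Euler step of the standard two-species Gray-Scott model
    on a periodic 2D grid (the base model extended to many chemicals in the paper)."""
    lap = lambda f: (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                     np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)
    uvv = u * v * v
    u_new = u + dt * (Du * lap(u) - uvv + F * (1 - u))
    v_new = v + dt * (Dv * lap(v) + uvv - (F + k) * v)
    return u_new, v_new

# Seed a single square "spot" in an otherwise uniform state and iterate;
# for parameter values in the self-replicating regime the spot divides.
n = 128
u, v = np.ones((n, n)), np.zeros((n, n))
u[60:68, 60:68], v[60:68, 60:68] = 0.50, 0.25
for _ in range(5000):
    u, v = gray_scott_step(u, v)
```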

    Formation of regular spatial patterns in ratio-dependent predator-prey model driven by spatial colored-noise

    Results are reported concerning the formation of spatial patterns in the two-species ratio-dependent predator-prey model driven by spatial colored noise. The results show that there is a critical value of the spatial noise intensity for this system when the parameters are in the Turing space: above it, regular spatial patterns appear in two dimensions, while below it no regular spatial patterns are produced. In particular, we investigate in two-dimensional space the formation of regular spatial patterns with the spatial noise added at the side and at the center of the simulation domain, respectively. Comment: 4 pages and 3 figures.
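
    A schematic sketch of the kind of simulation described above: a ratio-dependent predator-prey reaction-diffusion system driven by spatially correlated ("colored") noise. The dimensionless reaction terms are one common form of the ratio-dependent model, and the noise construction and parameter values are our own illustrative choices, not necessarily those of the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spatial_colored_noise(shape, intensity, corr_len, rng):
    """Spatially correlated noise: white noise smoothed with a Gaussian
    kernel, then rescaled to the requested intensity."""
    eta = gaussian_filter(rng.standard_normal(shape), sigma=corr_len)
    return intensity * eta / eta.std()

def step(u, v, noise, dt=0.01, Du=0.01, Dv=1.0, alpha=1.0, beta=2.0, gamma=0.6):
    """One explicit step of a ratio-dependent predator-prey system with
    diffusion on a periodic 2D grid; u is prey, v is predator."""
    lap = lambda f: (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                     np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)
    ratio = u / (u + v + 1e-9)                 # ratio-dependent functional response
    du = u * (1 - u) - alpha * v * ratio + Du * lap(u) + noise
    dv = v * (beta * ratio - gamma) + Dv * lap(v)
    return u + dt * du, v + dt * dv

rng = np.random.default_rng(1)
u = 0.5 + 0.01 * rng.standard_normal((128, 128))
v = 0.25 * np.ones((128, 128))
for _ in range(2000):
    noise = spatial_colored_noise(u.shape, intensity=0.05, corr_len=3, rng=rng)
    u, v = step(u, v, noise)
```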